Sparse factor analysis via likelihood and ℓ1-regularization

Authors

  • Lipeng Ning
  • Tryphon T. Georgiou
Abstract

In this note we consider the basic problem of identifying linear relations in noise. We follow the viewpoint of factor analysis (FA), where the data are to be explained by a small number of independent factors and independent noise. An approximation of the sample covariance is thus sought which can be factored accordingly. An algorithm is proposed which weighs an ℓ1-regularization term, inducing sparsity of the linear model (factor), against a likelihood term that quantifies the distance of the model from the sample covariance. The algorithm compares favorably against standard techniques of factor analysis. Performance is compared first by simulation, where ground truth is available, and then on stock-market data, where the proposed algorithm gives reasonable and sparser models.
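The abstract does not spell out the algorithm's iterations. As an illustration only, ℓ1-regularized objectives of this kind are commonly attacked with the soft-thresholding (proximal) operator; the sketch below uses hypothetical names (`soft_threshold`, `grad_f`, the loading matrix `L`) and is not the paper's actual method:

```python
import numpy as np

def soft_threshold(X, tau):
    """Proximal operator of tau * ||X||_1: shrinks each entry toward zero."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

# One illustrative proximal-gradient step on a factor-loading matrix L,
# for an objective of the form  f(L) + tau * ||L||_1,
# where grad_f stands in for the gradient of a likelihood-type term at L.
rng = np.random.default_rng(0)
L = rng.standard_normal((5, 2))        # 5 variables, 2 factors (toy sizes)
grad_f = rng.standard_normal((5, 2))   # placeholder gradient
step = 0.5
L_next = soft_threshold(L - step * grad_f, step * 0.8)
```

The thresholding zeroes out small loadings, which is how the ℓ1 term produces a sparse factor model.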


Similar Articles

Penalized Quantile Regression in High-Dimensional Sparse Models

We consider median regression and, more generally, quantile regression in high-dimensional sparse models. In these models the overall number of regressors p is very large, possibly larger than the sample size n, but only s of these regressors have non-zero impact on the conditional quantile of the response variable, where s grows slower than n. Since in this case the ordinary quantile regressio...
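The ℓ1-penalized quantile regression objective mentioned above combines the quantile (check) loss with a lasso penalty. A minimal sketch of that objective, with illustrative names only:

```python
import numpy as np

def pinball_loss(residuals, tau):
    """Quantile (check) loss: tau*r for r >= 0, (tau - 1)*r for r < 0."""
    r = np.asarray(residuals, dtype=float)
    return np.where(r >= 0, tau * r, (tau - 1) * r)

def penalized_objective(beta, X, y, tau, lam):
    """l1-penalized quantile regression objective (names are illustrative)."""
    return pinball_loss(y - X @ beta, tau).mean() + lam * np.abs(beta).sum()
```

At tau = 0.5 the check loss reduces to half the absolute error, recovering median regression.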


Remote sensing via ℓ1 minimization

We consider the problem of detecting the locations of targets in the far field by sending probing signals from an antenna array and recording the reflected echoes. Drawing on key concepts from the area of compressive sensing, we use an ℓ1-based regularization approach to solve this, in general ill-posed, inverse scattering problem. As common in compressive sensing, we exploit randomness, which ...
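The compressive-sensing setup referred to here is the underdetermined linear model y = Ax with a random sensing matrix A and a sparse unknown x. A toy version of that setup (dimensions and seed are arbitrary, not from the paper):

```python
import numpy as np

# Underdetermined random sensing model: fewer measurements than unknowns,
# but the target vector x has only a few nonzero entries.
rng = np.random.default_rng(1)
m, n, k = 20, 60, 3                       # measurements, unknowns, nonzeros
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x                                 # noiseless measurements (echoes)
```

ℓ1 minimization then seeks the sparsest x consistent with y, which random A makes well-posed with high probability despite m < n.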


Orbital minimization method with ℓ1 regularization

We consider a modification of the OMM energy functional which contains an ℓ1 penalty term in order to find a sparse representation of the low-lying eigenspace of self-adjoint operators. We analyze the local minima of the modified functional as well as the convergence of the modified functional to the original functional. Algorithms combining soft thresholding with gradient descent are proposed...
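"Soft thresholding with gradient descent" is the iterative soft-thresholding scheme (ISTA). A generic sketch of that iteration for a least-squares data term, not the OMM functional itself:

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=200):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of grad
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft(x - step * (A.T @ (A @ x - y)), step * lam)
    return x
```

Each iteration takes a gradient step on the smooth term and then applies the threshold, which drives small coefficients exactly to zero.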


A hierarchical sparsity-smoothness Bayesian model for ℓ0 + ℓ1 + ℓ2 regularization

Sparse signal/image recovery is a challenging topic that has captured a great interest during the last decades. To address the ill-posedness of the related inverse problem, regularization is often essential by using appropriate priors that promote the sparsity of the target signal/image. In this context, ℓ0 + ℓ1 regularization has been widely investigated. In this paper, we introduce a new prio...
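For concreteness, the combined ℓ0 + ℓ1 + ℓ2 penalty named in the title sums a nonzero count, an absolute-value term, and a squared-norm term. A minimal evaluation sketch (weights and names are illustrative, not the paper's hierarchical Bayesian prior):

```python
import numpy as np

def combined_penalty(x, a0, a1, a2):
    """l0 + l1 + l2 penalty: weighted nonzero count, absolute sum, squared norm."""
    x = np.asarray(x, dtype=float)
    return (a0 * np.count_nonzero(x)
            + a1 * np.abs(x).sum()
            + a2 * (x ** 2).sum())
```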


Variable selection for varying coefficient models with the sparse regularization

Varying-coefficient models are useful tools for analyzing longitudinal data. They can effectively describe the relationship between predictors and responses that are measured repeatedly. We consider the problem of selecting variables in the varying-coefficient models via the adaptive elastic net regularization. Coefficients given as functions are expressed by basis expansions, and then parameters involved...
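The adaptive elastic net penalty combines a data-driven weighted ℓ1 term with a ridge term. A minimal sketch of that penalty, with illustrative names and no claim to match the paper's estimator:

```python
import numpy as np

def adaptive_elastic_net(beta, weights, lam1, lam2):
    """Adaptive elastic net penalty: weighted l1 term plus a ridge (squared l2) term.

    weights are typically data-driven, e.g. inverse magnitudes of an
    initial estimate, so large coefficients are penalized less.
    """
    beta = np.asarray(beta, dtype=float)
    w = np.asarray(weights, dtype=float)
    return lam1 * (w * np.abs(beta)).sum() + lam2 * (beta ** 2).sum()
```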




Publication year: 2011